# Text Representation Learning
## Nezha Base Wwm
NEZHA is a Chinese pre-trained language model based on the Transformer architecture, optimized for Chinese text-understanding tasks and pre-trained with a whole-word-masking strategy.
- Tags: Large Language Model, Transformers
- Author: sijunhe
- Downloads: 66 · Likes: 2
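Whole word masking, mentioned above, differs from standard token masking in that when a word is selected, all of its subword pieces are masked together rather than independently. A minimal sketch in plain Python (the function name and the BERT-style `##` subword-continuation convention are illustrative assumptions, not from NEZHA's codebase):

```python
import random

def whole_word_mask(tokens, mask_ratio=0.15, seed=0):
    """Whole-word masking sketch: group subword pieces (those prefixed
    with '##') into word spans, then mask each chosen span entirely."""
    rng = random.Random(seed)
    spans = []
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and spans:
            spans[-1].append(i)  # continuation piece joins the current word
        else:
            spans.append([i])    # a new word starts here
    masked = list(tokens)
    for span in spans:
        if rng.random() < mask_ratio:
            for i in span:       # all-or-none: mask every piece of the word
                masked[i] = "[MASK]"
    return masked
```

Because masking is decided per word span, the model can never recover a masked piece from a sibling piece of the same word, which makes the prediction task harder and (per the NEZHA and BERT-wwm reports) improves Chinese language understanding.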
## Nezha Cn Base
NEZHA is a neural contextualized representation model for Chinese language understanding, based on the Transformer architecture and developed by Huawei Noah's Ark Lab.
- Tags: Large Language Model, Transformers
- Author: sijunhe
- Downloads: 1,443 · Likes: 12
## Bert Base Uncased Mlm
A masked language model (MLM) fine-tuned from bert-base-uncased, suitable for fill-mask (text infilling) tasks.
- License: Apache-2.0
- Tags: Large Language Model, Transformers
- Author: wypoon
- Downloads: 25 · Likes: 0
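The MLM objective behind models like this one corrupts the input and asks the model to predict the original tokens. In BERT-style pretraining, roughly 15% of positions are selected as targets; of those, 80% are replaced with `[MASK]`, 10% with a random token, and 10% left unchanged. A self-contained sketch (the toy vocabulary and function name are assumptions for illustration):

```python
import random

VOCAB = ["the", "cat", "sat", "on", "mat", "dog"]  # toy vocabulary (assumption)

def mlm_corrupt(tokens, select_ratio=0.15, seed=0):
    """BERT-style MLM corruption: select ~15% of positions as prediction
    targets; of those, 80% become [MASK], 10% a random token, 10% stay."""
    rng = random.Random(seed)
    corrupted = list(tokens)
    targets = {}  # position -> original token the model must predict
    for i, tok in enumerate(tokens):
        if rng.random() < select_ratio:
            targets[i] = tok
            r = rng.random()
            if r < 0.8:
                corrupted[i] = "[MASK]"
            elif r < 0.9:
                corrupted[i] = rng.choice(VOCAB)
            # else: leave the token unchanged (model still predicts it)
    return corrupted, targets
```

Keeping 10% of targets unchanged discourages the model from assuming every non-`[MASK]` token is correct, which helps at fine-tuning time when `[MASK]` never appears.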
## Mminilmv2 L12 H384 Distilled From XLMR Large
MiniLMv2 is a lightweight multilingual pre-trained model developed by Microsoft Research, distilled from XLM-R large and based on the Transformer architecture; it suits a wide range of natural language processing tasks.
- Tags: Large Language Model, Transformers
- Author: nreimers
- Downloads: 21.39k · Likes: 17
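Distillation, as used to produce this model from XLM-R large, trains a small student to match a large teacher's output distributions rather than hard labels. A generic sketch of the idea using a temperature-softened KL divergence (plain Python, no framework; this is the standard distillation loss, not MiniLMv2's specific attention-relation objective):

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-softened softmax over a list of logits."""
    exps = [math.exp(x / temperature) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kl_divergence(p, q):
    """KL(p || q) for two discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """Penalize the student for diverging from the teacher's softened
    distribution; higher temperature exposes the teacher's 'dark knowledge'
    about relative probabilities of non-top classes."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return kl_divergence(p, q)
```

The loss is zero when the student exactly reproduces the teacher's distribution and grows as the two diverge, so minimizing it transfers the teacher's behavior into a much smaller network.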
## Xlm Mlm En 2048
XLM is a masked language model trained on English text with a BERT-style MLM pretraining objective, supporting English language-processing tasks.
- Tags: Large Language Model, Transformers, English
- Author: FacebookAI
- Downloads: 1,734 · Likes: 0